iterative analysis - translation into Russian
DICLIB.COM
AI language tools

Translation and analysis of words using artificial intelligence

On this page you can get a detailed analysis of a word or phrase, produced with the best artificial-intelligence technology available today:

  • how the word is used in the language
  • the frequency of the word
  • whether the word is used more often in spoken or written language
  • options for translating the word into Russian or Spanish, respectively
  • examples of the word's usage (several phrases with their translations)
  • the etymology of the word

iterative analysis - translation into Russian

CONVERSION OF A SET OF OBSERVATIONS OF POSSIBLY CORRELATED VARIABLES INTO A SET OF VALUES OF LINEARLY UNCORRELATED VARIABLES CALLED PRINCIPAL COMPONENTS
Principal components analysis; Hotelling transform; Principal component; Principle components; Principal components; Principle Component Analysis; Principle component analysis; Principal Component Analysis; Eigenimage; KL transform; Principle components analysis; Characteristic vector analysis; Eigenvector analysis; Principal Components Analysis; Probabilistic principal component analysis; Non-linear iterative partial least squares; NIPALS; Nonlinear iterative partial least squares; Principal-components analysis; Principal-component analysis; Conditional principal components analysis; Principal component analyses
  • Figure caption: Iconography of correlations applied to the geochemistry of marine aerosols.
  • Figure caption: Fractional residual variance (FRV) plots for PCA and NMF; for PCA, the theoretical values are the contribution of the residual eigenvalues. The FRV curves for PCA reach a flat plateau where no signal is captured effectively, while the NMF FRV curves decline continuously, indicating a better ability to capture signal; they also converge to higher levels than PCA's, indicating that NMF overfits less.
  • Figure caption: Eigenvectors of the covariance matrix, scaled by the square root of the corresponding eigenvalue and shifted so their tails are at the mean.
  • Figure caption: A principal components analysis scatterplot of Y-STR haplotypes calculated from repeat-count values for 37 Y-chromosomal STR markers from 354 individuals. PCA has found linear combinations of the markers that separate clusters corresponding to different lines of Y-chromosomal genetic descent.
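To make the definition above concrete, here is a minimal, self-contained Python sketch (not taken from any of the sources cited on this page; the synthetic data and variable names are assumptions for illustration). It converts a set of observations of correlated variables into linearly uncorrelated principal-component scores via the eigenvectors of the covariance matrix:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic set of observations of correlated variables:
    # 200 samples of 3 variables, with deliberate correlation.
    X = rng.normal(size=(200, 3))
    X[:, 1] += 0.8 * X[:, 0]
    X[:, 2] += 0.5 * X[:, 0]

    Xc = X - X.mean(axis=0)              # center each variable

    # The principal components are the eigenvectors of the covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]    # sort by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    scores = Xc @ eigvecs                # values of the uncorrelated variables

    # Off-diagonal covariances of the scores are ~0: linearly uncorrelated.
    print(np.round(np.cov(scores, rowvar=False), 6))

The near-zero off-diagonal entries of the printed covariance matrix confirm that the component scores are linearly uncorrelated, which is exactly the "conversion" the definition describes.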

iterative analysis      

математика (mathematics)

повторный анализ (repeated analysis)

iteration algorithm         
NUMERICAL METHOD IN WHICH THE N-TH APPROXIMATION OF THE SOLUTION IS OBTAINED ON THE BASIS OF THE (N-1) PREVIOUS APPROXIMATIONS
Iterative methods; Krylov subspace methods; Krylov subspace method; Iterative approximation; Iteration scheme; Iterative algorithm; Iterative procedure; Iteration algorithm; Iteration methods; Iterative convergence; Iterative solver; Direct method (computational mathematics); Stationary iterative method; Iteration method

математика (mathematics)

итерационный алгоритм (iterative algorithm)
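As an illustration of the definition above, the following sketch (the equation x = cos x, the initial guess, and the tolerance are arbitrary choices, not part of the dictionary entry) shows a fixed-point iteration in Python, where each new approximation is computed from the previous one:

    import math

    # Fixed-point iteration for x = cos(x): the n-th approximation is
    # computed from the (n-1)-th, as in the definition above.
    x = 1.0                          # initial guess (arbitrary)
    for n in range(1, 100):
        x_new = math.cos(x)
        if abs(x_new - x) < 1e-12:   # stop when successive iterates agree
            break
        x = x_new

    print(n, x_new)                  # about 0.739085, the fixed point of cos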

iterative procedure         
NUMERICAL METHOD IN WHICH THE N-TH APPROXIMATION OF THE SOLUTION IS OBTAINED ON THE BASIS OF THE (N-1) PREVIOUS APPROXIMATIONS
Iterative methods; Krylov subspace methods; Krylov subspace method; Iterative approximation; Iteration scheme; Iterative algorithm; Iterative procedure; Iteration algorithm; Iteration methods; Iterative convergence; Iterative solver; Direct method (computational mathematics); Stationary iterative method; Iteration method

математика (mathematics)

итерационная процедура (iterative procedure)
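The alias list for this entry mentions stationary iterative methods. As one concrete instance, here is a minimal sketch of the Jacobi method for a linear system Ax = b (the matrix, right-hand side, and tolerance are illustrative assumptions):

    import numpy as np

    # Jacobi iteration for A x = b. A is diagonally dominant, which is
    # sufficient for convergence; A, b, and the tolerance are illustrative.
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 5.0, 2.0],
                  [0.0, 2.0, 6.0]])
    b = np.array([1.0, 2.0, 3.0])

    D = np.diag(A)                   # diagonal entries of A
    R = A - np.diag(D)               # off-diagonal remainder

    x = np.zeros_like(b)
    for _ in range(200):
        x_new = (b - R @ x) / D      # n-th approximation from the (n-1)-th
        if np.linalg.norm(x_new - x) < 1e-12:
            break
        x = x_new

    print(x_new, np.allclose(A @ x_new, b))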

Definition

ТЕХНИЧЕСКИЙ АНАЛИЗ (technical analysis)
A set of physical, physicochemical, and chemical methods for the analysis of raw materials, semi-finished products, and finished industrial products. The types of analyses, methods, techniques, reagents, and so on are regulated by GOST standards and technical specifications.

Wikipedia

Principal component analysis

Principal component analysis (PCA) is a popular technique for analyzing large datasets containing a high number of dimensions/features per observation, increasing the interpretability of data while preserving the maximum amount of information, and enabling the visualization of multidimensional data. Formally, PCA is a statistical technique for reducing the dimensionality of a dataset. This is accomplished by linearly transforming the data into a new coordinate system where (most of) the variation in the data can be described with fewer dimensions than the initial data. Many studies use the first two principal components in order to plot the data in two dimensions and to visually identify clusters of closely related data points. Principal component analysis has applications in many fields such as population genetics, microbiome studies, and atmospheric science.

The principal components of a collection of points in a real coordinate space are a sequence of p unit vectors, where the i-th vector is the direction of a line that best fits the data while being orthogonal to the first i − 1 vectors. Here, a best-fitting line is defined as one that minimizes the average squared perpendicular distance from the points to the line. These directions constitute an orthonormal basis in which different individual dimensions of the data are linearly uncorrelated. Principal component analysis is the process of computing the principal components and using them to perform a change of basis on the data, sometimes using only the first few principal components and ignoring the rest.
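The best-fitting-line characterization can be verified numerically. The sketch below (synthetic two-dimensional data, an assumption for illustration) checks that no randomly chosen direction achieves a smaller average squared perpendicular distance than the leading eigenvector:

    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic correlated 2-D point cloud (illustrative assumption).
    X = rng.normal(size=(300, 2)) @ np.array([[3.0, 0.0], [1.5, 0.5]])
    Xc = X - X.mean(axis=0)

    def mean_sq_perp_dist(points, u):
        # Squared distance from a point to the line span{u} (u a unit vector)
        # is its squared norm minus its squared projection onto u.
        proj = points @ u
        return np.mean(np.sum(points**2, axis=1) - proj**2)

    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    pc1 = eigvecs[:, -1]                 # eigenvector of the largest eigenvalue

    dirs = rng.normal(size=(1000, 2))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    best_random = min(mean_sq_perp_dist(Xc, u) for u in dirs)

    print(mean_sq_perp_dist(Xc, pc1) <= best_random)   # True: PC1 fits best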

In data analysis, the first principal component of a set of p variables, presumed to be jointly normally distributed, is the derived variable formed as a linear combination of the original variables that explains the most variance. The second principal component explains the most variance in what is left once the effect of the first component is removed, and we may proceed through p iterations until all the variance is explained. PCA is most commonly used when many of the variables are highly correlated with each other and it is desirable to reduce their number to an independent set.
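This component-by-component description is essentially what deflation-based algorithms such as NIPALS implement. A minimal sketch using power iteration with deflation (the iteration count and synthetic data are arbitrary choices):

    import numpy as np

    def pca_by_deflation(X, n_components, iters=500):
        # Extract principal components one at a time: find the direction of
        # maximum remaining variance, then remove its effect (deflation).
        Xc = X - X.mean(axis=0)
        components = []
        for _ in range(n_components):
            C = np.cov(Xc, rowvar=False)
            v = np.ones(Xc.shape[1])
            for _ in range(iters):           # power iteration on the covariance
                v = C @ v
                v /= np.linalg.norm(v)
            components.append(v)
            Xc = Xc - np.outer(Xc @ v, v)    # subtract what v explains
        return np.array(components)

    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 4)) @ rng.normal(size=(4, 4))  # synthetic data
    print(pca_by_deflation(X, 2))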

PCA is used in exploratory data analysis and for making predictive models. It is commonly used for dimensionality reduction by projecting each data point onto only the first few principal components to obtain lower-dimensional data while preserving as much of the data's variation as possible. The first principal component can equivalently be defined as a direction that maximizes the variance of the projected data. The i-th principal component can be taken as a direction orthogonal to the first i − 1 principal components that maximizes the variance of the projected data.
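A short sketch of the projection described here, with synthetic data and the choice k = 2 as illustrative assumptions, reporting the fraction of total variance the projection preserves:

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))  # synthetic
    Xc = X - X.mean(axis=0)

    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    k = 2                          # keep only the first two principal components
    Z = Xc @ eigvecs[:, :k]        # lower-dimensional data, shape (500, 2)

    # Fraction of the total variance preserved by the projection.
    print(Z.shape, eigvals[:k].sum() / eigvals.sum())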

For either objective, it can be shown that the principal components are eigenvectors of the data's covariance matrix. Thus, the principal components are often computed by eigendecomposition of the data covariance matrix or singular value decomposition of the data matrix. PCA is the simplest of the true eigenvector-based multivariate analyses and is closely related to factor analysis. Factor analysis typically incorporates more domain-specific assumptions about the underlying structure and solves for the eigenvectors of a slightly different matrix. PCA is also related to canonical correlation analysis (CCA). CCA defines coordinate systems that optimally describe the cross-covariance between two datasets, while PCA defines a new orthogonal coordinate system that optimally describes variance in a single dataset. Robust and L1-norm-based variants of standard PCA have also been proposed.
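The two computational routes mentioned here can be checked against each other. In the sketch below (synthetic data assumed), the right singular vectors of the centered data matrix agree, up to sign, with the eigenvectors of its covariance matrix, and the squared singular values divided by n − 1 reproduce the eigenvalues:

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # synthetic data
    Xc = X - X.mean(axis=0)

    # Route 1: eigendecomposition of the data covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # Route 2: singular value decomposition of the centered data matrix.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Right singular vectors match covariance eigenvectors up to sign,
    # and squared singular values over (n - 1) reproduce the eigenvalues.
    print(np.allclose(np.abs(eigvecs), np.abs(Vt.T)))
    print(np.allclose(s**2 / (len(Xc) - 1), eigvals))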

What is the Russian for iterative analysis? Translation of 'iterative analysis' into Russian.